
Multifaceted Uncertainty Estimation for Label-Efficient Deep Learning

Neural Information Processing Systems

We present a novel multi-source uncertainty prediction approach that enables deep learning (DL) models to be actively trained with far less labeled data. By leveraging the second-order uncertainty representation provided by subjective logic (SL), we conduct an evidence-based theoretical analysis and formally decompose the predicted entropy over multiple classes into two distinct sources of uncertainty: vacuity and dissonance, caused by a lack of evidence and a conflict of strong evidence, respectively. The evidence-based entropy decomposition provides deeper insights into the nature of uncertainty, which can help effectively explore a large and high-dimensional unlabeled data space. We develop a novel loss function that augments DL-based evidence prediction with uncertainty anchor sample identification. The accurately estimated multiple sources of uncertainty are systematically integrated and dynamically balanced using a data sampling function for label-efficient active deep learning (ADL). Experiments conducted on both synthetic and real data, together with comparisons against competitive active learning (AL) methods, demonstrate the effectiveness of the proposed ADL model.
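To make the two quantities concrete, the sketch below computes vacuity and dissonance from non-negative class evidence using the standard subjective-logic definitions (belief masses from a Dirichlet with parameters alpha_k = e_k + 1). It is a minimal illustration only; the paper's loss function and anchor-sample identification are not reproduced, and the function name is hypothetical.

import numpy as np

def vacuity_dissonance(evidence):
    # Subjective-logic opinion from non-negative class evidence e_k,
    # with Dirichlet parameters alpha_k = e_k + 1 and strength S = sum(e) + K.
    e = np.asarray(evidence, dtype=float)
    K = e.size
    S = e.sum() + K
    b = e / S                      # belief masses
    vacuity = K / S                # uncertainty mass caused by lack of evidence

    def balance(bj, bk):           # relative mass balance Bal(b_j, b_k)
        return 0.0 if bj + bk == 0 else 1.0 - abs(bj - bk) / (bj + bk)

    dissonance = 0.0               # uncertainty caused by conflicting strong evidence
    for k in range(K):
        others = [j for j in range(K) if j != k]
        denom = sum(b[j] for j in others)
        if denom > 0:
            dissonance += b[k] * sum(b[j] * balance(b[j], b[k]) for j in others) / denom
    return vacuity, dissonance

# vacuity_dissonance([0, 0, 0])   -> vacuity ~1.0, dissonance 0.0 (no evidence at all)
# vacuity_dissonance([10, 10, 0]) -> low vacuity, high dissonance (conflicting strong evidence)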


Review for NeurIPS paper: Multifaceted Uncertainty Estimation for Label-Efficient Deep Learning

Neural Information Processing Systems

Weaknesses: I have several concerns regarding this work. Firstly, I'm not entirely convinced of the need to introduce an evidence-based Dempster-Shafer / subjective logic framework. The proposed decomposition into vacuity and dissonance is essentially the same as decomposing into epistemic and aleatoric uncertainty. Why not consider the tractable closed-form mutual information decomposition into total, epistemic, and aleatoric uncertainties that was derived in previous work? I believe that decomposition would, broadly speaking, yield many of the same quantities.
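For reference, the closed-form decomposition the reviewer appears to have in mind (as used with Dirichlet-based models such as Prior Networks) splits total predictive entropy into expected data entropy (aleatoric) plus mutual information (epistemic). A minimal sketch, assuming a Dirichlet over class probabilities; the function name is illustrative:

import numpy as np
from scipy.special import digamma

def entropy_decomposition(alpha):
    # Closed-form split of predictive entropy for a Dirichlet(alpha) over class
    # probabilities: total = aleatoric (expected data entropy) + epistemic (mutual information).
    alpha = np.asarray(alpha, dtype=float)
    a0 = alpha.sum()
    p_mean = alpha / a0
    total = -np.sum(p_mean * np.log(p_mean))                                   # H[E[p]]
    aleatoric = -np.sum(p_mean * (digamma(alpha + 1.0) - digamma(a0 + 1.0)))   # E[H[p]]
    epistemic = total - aleatoric                                              # I(y; theta | x)
    return total, aleatoric, epistemic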


Review for NeurIPS paper: Multifaceted Uncertainty Estimation for Label-Efficient Deep Learning

Neural Information Processing Systems

Incorporating notions of "vacuity" and "dissonance" in active learning is novel, and reviewers are convinced of its improvements. The empirical evaluation is somewhat lacking with respect to certain baselines and scale, and while the work may be sufficient for acceptance, the reviewers all believe the revision would at least benefit from further discussion of related work: Prior Networks and Noise Contrastive Priors. In addition, more discussion of the need for the Dempster-Shafer framework, as R1 asks, is very welcome.

